
    Regulating Cancer Stem Cells the miR Way

    A recent study in Nature Cell Biology by Wellner et al. (2009) identifies ZEB1, a known promoter of tumor invasion, as a negative regulator of miRNA clusters that target stem cell factors. These findings provide new insight into the network of transcription factors and miRNAs that regulate cancer stem cells.

    Clinical review: Strict or loose glycemic control in critically ill patients - implementing best available evidence from randomized controlled trials

    Glycemic control aiming at normoglycemia, frequently referred to as 'strict glycemic control' (SGC), decreased mortality and morbidity of adult critically ill patients in two randomized controlled trials (RCTs). Five successive RCTs, however, failed to show benefit of SGC, with one trial even reporting an unexpectedly higher mortality. Consequently, enthusiasm for the implementation of SGC has declined, hampering translation of SGC into daily ICU practice. In this manuscript we attempt to explain the variance in outcomes of the RCTs of SGC, and point out other limitations of the current literature on glycemic control in ICU patients. There are several alternative explanations for why the five negative RCTs showed no beneficial effects of SGC, apart from the possibility that SGC may indeed not benefit ICU patients. These include, but are not restricted to, variability in the performance of SGC, differences among trial designs, changes in standard of care, differences in timing (that is, initiation) of SGC, and the convergence between the intervention and control groups with respect to achieved blood glucose levels in the successive RCTs. Additional factors that may hamper translation of SGC into daily ICU practice include the feared risk of severe hypoglycemia, the additional labor associated with SGC, and uncertainty about who should be the primarily responsible caregiver for the implementation of SGC.

    Abnormal connectional fingerprint in schizophrenia: a novel network analysis of diffusion tensor imaging data

    The graph theoretical analysis of structural magnetic resonance imaging (MRI) data has received a great deal of interest in recent years to characterize the organizational principles of brain networks and their alterations in psychiatric disorders, such as schizophrenia. However, the characterization of networks in clinical populations can be challenging, since the comparison of connectivity between groups is influenced by several factors, such as the overall number of connections and the structural abnormalities of the seed regions. To overcome these limitations, the current study employed the whole-brain analysis of connectional fingerprints in diffusion tensor imaging data obtained at 3 T of chronic schizophrenia patients (n = 16) and healthy, age-matched control participants (n = 17). Probabilistic tractography was performed to quantify the connectivity of 110 brain areas. The connectional fingerprint of a brain area represents the set of relative connection probabilities to all its target areas and is, hence, less affected by overall white and gray matter changes than absolute connectivity measures. After detecting brain regions with abnormal connectional fingerprints through similarity measures, we tested each of their relative connection probabilities between groups. We found altered connectional fingerprints in schizophrenia patients consistent with a dysconnectivity syndrome. While the medial frontal gyrus showed only reduced connectivity, the connectional fingerprints of the inferior frontal gyrus and the putamen mainly contained relatively increased connection probabilities to frontal, limbic, and subcortical areas. These findings are in line with previous studies that reported abnormalities in striatal–frontal circuits in the pathophysiology of schizophrenia, highlighting the potential utility of connectional fingerprints for the analysis of anatomical networks in the disorder.
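    The fingerprint idea described above can be illustrated with a minimal numpy sketch: normalize a seed region's streamline counts into relative connection probabilities, then compare two fingerprints with a similarity measure. The function names, toy numbers, and the choice of cosine similarity are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def connectional_fingerprint(conn_row):
    """Relative connection probabilities of one seed region: each
    target's streamline count divided by the row total."""
    total = conn_row.sum()
    return conn_row / total if total > 0 else conn_row

def cosine_similarity(fp_a, fp_b):
    """One possible similarity measure between two fingerprints
    (illustrative choice; the paper does not specify this one)."""
    return float(np.dot(fp_a, fp_b) /
                 (np.linalg.norm(fp_a) * np.linalg.norm(fp_b)))

# Toy streamline counts from one seed region to 5 target areas
patient_row = np.array([10.0, 40.0, 25.0, 5.0, 20.0])
control_row = np.array([30.0, 30.0, 20.0, 10.0, 10.0])

fp_patient = connectional_fingerprint(patient_row)
fp_control = connectional_fingerprint(control_row)
similarity = cosine_similarity(fp_patient, fp_control)
```

    Because each fingerprint sums to one, a global scaling of a subject's streamline counts (e.g. from overall white matter differences) cancels out, which is the property the abstract highlights.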

    Integrating Temporal and Spatial Scales: Human Structural Network Motifs Across Age and Region of Interest Size

    Human brain networks can be characterized at different temporal or spatial scales given by the age of the subject or the spatial resolution of the neuroimaging method. Integration of data across scales can only be successful if the combined networks show a similar architecture. One way to compare networks is to look at spatial features, based on fiber length, and topological features of individual nodes, where outlier nodes form single node motifs whose frequency yields a fingerprint of the network. Here, we observe how characteristic single node motifs change over age (12–23 years) and network size (414, 813, and 1615 nodes) for diffusion tensor imaging structural connectivity in healthy human subjects. First, we find the number and diversity of motifs in a network to be strongly correlated. Second, comparing different scales, the number and diversity of motifs varied across the temporal (subject age) and spatial (network resolution) scale: certain motifs might only occur at one spatial scale or for a certain age range. Third, regions of interest which show one motif at a lower resolution may show a range of motifs at a higher resolution, which may or may not include the original motif at the lower resolution. Therefore, both the type and localization of motifs differ for different spatial resolutions. Our results also indicate that spatial resolution has a greater effect on topological measures, whereas spatial measures, based on fiber lengths, remain more comparable between resolutions. Therefore, spatial resolution is crucial when comparing characteristic node fingerprints given by topological and spatial network features. As node motifs are based on topological and spatial properties of brain connectivity networks, these conclusions are also relevant to other studies using connectome analysis.
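    The abstract defines single node motifs as outlier nodes with respect to topological and spatial features. A minimal sketch of that idea, assuming a simple z-score outlier rule over per-node features (the feature choice, threshold, and toy numbers are illustrative, not the paper's actual motif detection):

```python
import numpy as np

def node_feature_zscores(features):
    """z-score each feature column across nodes; a node whose profile
    contains an extreme value is an outlier ("single node motif")
    candidate."""
    mu = features.mean(axis=0)
    sd = features.std(axis=0)
    return (features - mu) / np.where(sd > 0, sd, 1.0)

# Toy per-node features: a topological measure (degree) and a spatial
# measure (mean fiber length, mm) for 10 regions; node 3 is a hub.
degrees = np.array([10, 11, 9, 40, 10, 12, 9, 11, 10, 10], dtype=float)
lengths = np.array([50, 52, 48, 55, 51, 49, 50, 53, 47, 50], dtype=float)
features = np.column_stack([degrees, lengths])

z = node_feature_zscores(features)
outliers = np.where(np.abs(z).max(axis=1) > 2.5)[0]  # motif candidates
```

    Repeating such a detection at different parcellation resolutions (414, 813, 1615 nodes) is what lets one ask whether a region keeps, loses, or changes its motif across scales.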

    Tight glycaemic control: intelligent technology or a nurse-wise strategy?

    Despite disappointing findings with the computerized decision-supported tight glycaemic control (TGC) protocol, Shulman and colleagues [1] argue that one reason to proceed with computerized TGC protocols is that complex protocols remain mandatory for TGC. Indeed, most intensivists think of TGC as difficult and complex. In The Netherlands as many as 46 different protocols are in use, including protocols with flowcharts, sliding scales, calculators and conversion tables as well as computerized decision-support protocols (survey, de Graaff MJ, Royakkers AANM, Kieft H, Spronk PE, van der Sluijs HP, Schultz MJ, unpublished data); all of them are exceptionally complex and frequently difficult to follow. We recently had the opportunity to visit the Leuven hospital and were surprised to see their protocol, which is remarkably concise, far from complex, an

    Effects of Field Borders and Mesomammal Reduction on Northern Bobwhite and Songbird Abundance on Three Farms in North Carolina (Oral Abstract)

    Lack of early nesting habitat may be limiting population levels of northern bobwhites (Colinus virginianus) and early successional songbirds on agricultural landscapes. Alternatively, detrimental effects of mesomammal predators on nesting success and survivorship of bobwhites may be causal at low densities. Previous research has documented increased use of agricultural areas by bobwhites on farms with field borders, but bobwhites had low nesting success in these areas. No replicated studies in the southeast United States have been conducted investigating the effects of field borders and mesomammal predator reduction on bobwhite and songbird abundance. We conducted a 3-year study on farms in Hyde, Tyrrell, and Wilson counties, North Carolina, using 2 x 2 factorial treatment combinations and a blocked study design. On each study area, four 200-ha farm blocks were randomly assigned 1 of 4 treatments. Treatments included: (1) 5–10 m fallow vegetation borders on all disked field edges, (2) removal of mesomammal nest predators (raccoons (Procyon lotor), opossums (Didelphis virginiana), and foxes (Urocyon cinereoargenteus and Vulpes vulpes)) from January through June of each year, (3) a combination of field borders and predator reduction, or (4) neither treatment. In 1997–99, we measured fall abundance of bobwhite coveys on farm blocks using morning covey call surveys and summer abundance of songbirds using variable radius point counts. Field borders were established in 1996 in Hyde and Wilson counties and 1997 in the Tyrrell county study area. The number of mesomammal predators annually removed from farm blocks averaged 42 (SE = 3.5) and was similar between study areas and years. Field border farm blocks had consistently more coveys heard than non-border farm blocks (F1,2 = 216.0, P < 0.004). However, there were no differences in the number of coveys heard between predator reduction and non-reduction farms (F1,2 = 10.4, P = 0.084). Farms with both field borders and predator reduction had more coveys heard compared to other farm blocks (F1,2 = 43.3, P < 0.0223). Summer bobwhite abundance was greater on field border areas (F1,6 = 5.93, P < 0.051). No other differences in songbird abundance were detected between field border and non-border farms. In 1997, songbird nest density was estimated in field border and non-border farms on the Wilson County study area. Field border farms had higher nest density, particularly for field sparrows (Spizella pusilla) and common yellowthroats (Geothlypis trichas), and had greater nesting bird diversity. Field borders were a practical technique to increase bobwhite abundance on small farm blocks. Increases in bobwhite abundance associated with predator reduction on small farms with field borders would not be economically feasible in most circumstances.
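    The F statistics reported with (1, 2) degrees of freedom can be converted to p-values without statistical tables, since an F(1, ν) variate is a squared t(ν) variate and the t distribution with 2 df has a closed-form CDF, giving P(F(1,2) > f) = 1 − √(f/(f+2)). A small stdlib sketch, assuming the reported statistics carry (1, 2) df as stated (the F(1,6) result needs a different formula and is omitted):

```python
import math

def p_from_F_1_2(F):
    """p-value for an F statistic with (1, 2) degrees of freedom,
    via F(1,2) = t(2)^2 and the closed-form t(2) CDF:
    P(F > f) = 1 - sqrt(f / (f + 2))."""
    t = math.sqrt(F)
    return 1.0 - t / math.sqrt(t * t + 2.0)

p_border   = p_from_F_1_2(216.0)  # coveys, border vs non-border
p_predator = p_from_F_1_2(10.4)   # predator reduction vs none
p_combined = p_from_F_1_2(43.3)   # borders + predator reduction
```

    The computed values for F = 10.4 (≈ 0.084) and F = 43.3 (≈ 0.0223) reproduce the p-values reported in the abstract, which supports the (1, 2) df reading.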

    Innovation in patient-centered care: lessons from a qualitative study of innovative health care organizations in Washington State

    Background Growing interest in the promise of patient-centered care has led to numerous health care innovations, including the patient-centered medical home, shared decision-making, and payment reforms. How best to vet and adopt innovations is an open question. Washington State has been a leader in health care reform and is a rich laboratory for patient-centered innovations. We sought to understand the process of patient-centered care innovation undertaken by innovative health care organizations – from strategic planning to goal selection to implementation to maintenance. Methods We conducted key-informant interviews with executives at five health plans, five provider organizations, and ten primary care clinics in Washington State. At least two readers of each interview transcript identified themes inductively; final themes were determined by consensus. Results Innovation in patient-centered care was a strategic objective chosen by nearly every organization in this study. However, other goals were paramount: cost containment, quality improvement, and organization survival. Organizations commonly perceived effective chronic disease management and integrated health information technology as key elements for successful patient-centered care innovation. Inertia, resource deficits, fee-for-service payment, and regulatory limits on scope of practice were cited as barriers to innovation, while organization leadership, human capital, and adaptive culture facilitated innovation. Conclusions Patient-centered care innovations reflected organizational perspectives: health plans emphasized cost-effectiveness while providers emphasized health care delivery processes. Health plans and providers shared many objectives, yet the two rarely collaborated to achieve them. The process of innovation is heavily dependent on organizational culture and leadership. 
Policymakers can improve the pace and quality of patient-centered innovation by setting targets and addressing conditions for innovation.

    Predicting Surgery Targets in Temporal Lobe Epilepsy through Structural Connectome Based Simulations

    Temporal lobe epilepsy (TLE) is a prevalent neurological disorder resulting in disruptive seizures. In the case of drug-resistant epilepsy, resective surgery is often considered. This is a procedure hampered by unpredictable success rates, with many patients continuing to have seizures even after surgery. In this study we apply a computational model of epilepsy to patient-specific structural connectivity derived from diffusion tensor imaging (DTI) of 22 individuals with left TLE and 39 healthy controls. We validate the model by examining patient-control differences in simulated seizure onset time and network location. We then investigate the potential of the model for surgery prediction by performing in silico surgical resections, removing nodes from patient networks and comparing seizure likelihood post-surgery to pre-surgery simulations. We find that, first, patients tend to transit from non-epileptic to epileptic states more often than controls in the model. Second, regions in the left hemisphere (particularly within temporal and subcortical regions) that are known to be involved in TLE are the most frequent starting points for seizures in patients in the model. In addition, our analysis also implicates regions in contralateral and frontal locations which may play a role in seizure spreading or surgery resistance. Finally, the model predicts that patient-specific surgery (resection areas chosen on an individual, model-prompted basis and not following a predefined procedure) may lead to better outcomes than the currently used routine clinical procedure. Taken together, this work provides a first step towards patient-specific computational modelling of epilepsy surgery in order to inform treatment strategies in individuals.
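    The in silico resection procedure described above (remove nodes from a patient network, re-simulate, compare seizure likelihood) can be sketched with a toy dynamics in place of the paper's actual epilepsy model. Here a simple linear-threshold cascade stands in for seizure spread, and "resection" zeroes one node's connections; the network, thresholds, and function names are illustrative assumptions.

```python
import numpy as np

def spread_size(adj, seed, steps=10, threshold=0.5):
    """Toy linear-threshold cascade: a node activates when the summed
    weight from already-active neighbours exceeds `threshold`.
    Returns the fraction of nodes recruited, a crude stand-in for
    seizure spread (not the model used in the paper)."""
    n = adj.shape[0]
    active = np.zeros(n, dtype=bool)
    active[seed] = True
    for _ in range(steps):
        drive = adj @ active
        newly = (drive > threshold) & ~active
        if not newly.any():
            break
        active |= newly
    return active.mean()

def resect(adj, node):
    """In silico resection: remove one node's connections."""
    out = adj.copy()
    out[node, :] = 0.0
    out[:, node] = 0.0
    return out

# Toy network: a hub (node 0) strongly connected to all other nodes,
# weak background connectivity everywhere else.
n = 8
adj = np.full((n, n), 0.1)
adj[0, :] = adj[:, 0] = 0.6
np.fill_diagonal(adj, 0.0)

pre  = spread_size(adj, seed=1)             # spread with the hub intact
post = spread_size(resect(adj, 0), seed=1)  # spread after hub resection
```

    In this toy network, activity seeded anywhere recruits the whole network via the hub, while removing the hub confines it to the seed, which is the qualitative before/after comparison the study performs per candidate resection.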